How do I optimize datasets in Domo while importing a huge (30 columns / 10 million rows) database?

AritraChaudhuri
Member
in Archive
How do I optimize datasets in Domo while importing a huge (30 columns / 10 million rows) database from:
1. Progress (OpenEdge database)
2. Oracle
3. PostgreSQL (ETL)
Comments
@AritraChaudhuri We really need more information. Are you using Workbench to upload the data, or are you using direct connections? You mention optimization, so I assume you have a slowdown you are trying to overcome. Is it happening on import, or is there an issue in some transformation? How often are you uploading this data? Is the data being uploaded using a replace or an append update method?
If we can get more insight into what you are doing and what your end goal is, we can provide better options for optimization.
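To illustrate why the replace-vs-append question matters: with a replace, every run re-extracts and re-uploads all 10 million rows, while an append keyed on a watermark only moves the delta. Here is a minimal sketch of an incremental pull from PostgreSQL, assuming a hypothetical orders table with an updated_at column and made-up connection details; the resulting CSV could then be appended to the dataset via Workbench or the Domo API:

```python
import csv
import psycopg2

# Watermark from the previous run; persist it somewhere durable
# (a file or a small control table) and update it after each load.
LAST_RUN = "2024-01-01 00:00:00"

conn = psycopg2.connect(
    host="db.example.com",   # hypothetical connection details
    dbname="appdb",
    user="etl_user",
    password="secret",
)
try:
    # A named (server-side) cursor streams results in batches instead of
    # pulling all 10M rows into client memory at once.
    with conn.cursor(name="incremental_pull") as cur:
        cur.itersize = 50_000  # rows fetched per round trip
        cur.execute(
            "SELECT * FROM orders WHERE updated_at > %s ORDER BY updated_at",
            (LAST_RUN,),
        )
        with open("orders_delta.csv", "w", newline="") as f:
            writer = csv.writer(f)
            for row in cur:
                writer.writerow(row)
finally:
    conn.close()
```

The same watermark idea applies to the Oracle and Progress/OpenEdge sources; only the driver and SQL dialect change.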
Thanks for being part of the Dojo.
Alex Peay
Product Manager
Domo